Combining Multi-class SVMs with Linear Ensemble Methods that Estimate the Class Posterior Probabilities
Author
Abstract
Roughly speaking, there is one main model of pattern recognition support vector machine, with several variants of lower popularity. On the contrary, among the different multi-class support vector machines which can be found in the literature, none is clearly favoured. On the one hand, they exhibit distinct statistical properties. On the other hand, multiple comparative studies between multi-class support vector machines and decomposition methods have highlighted the fact that each model has its advantages and drawbacks. These observations call for the evaluation of combinations of multi-class support vector machines. In this article, we study the combination of multi-class support vector machines with linear ensemble methods. Their sample complexity is low, which should prevent them from overfitting, and the outputs of two of them are estimates of the class posterior probabilities.
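The sketch below is not taken from the article; it is a hypothetical illustration of a linear ensemble method of the kind studied here, assuming that each base multi-class SVM already outputs class posterior estimates and that the combiner is restricted to non-negative weights summing to one, so that the combined output remains a distribution over the classes. All names and values are illustrative assumptions.

```python
# Hedged sketch (not the authors' implementation): linearly combining the
# class posterior estimates of K multi-class SVMs. Weights are constrained
# to the probability simplex so the combined output is itself a distribution
# over the Q classes.
import numpy as np

def combine_posteriors(posteriors, weights):
    """posteriors : array of shape (K, n_samples, Q), base M-SVM posterior estimates.
       weights    : array of shape (K,), non-negative and summing to one."""
    posteriors = np.asarray(posteriors)
    weights = np.asarray(weights, dtype=float)
    assert np.all(weights >= 0) and np.isclose(weights.sum(), 1.0)
    # Weighted average over the K base models -> shape (n_samples, Q)
    return np.tensordot(weights, posteriors, axes=([0], [0]))

# Toy usage: three hypothetical M-SVM outputs on two samples, four classes.
rng = np.random.default_rng(0)
raw = rng.random((3, 2, 4))
base_posteriors = raw / raw.sum(axis=2, keepdims=True)  # normalise each row
weights = np.array([0.5, 0.3, 0.2])
combined = combine_posteriors(base_posteriors, weights)
predictions = combined.argmax(axis=1)                   # MAP class labels
```

Constraining the weights to the simplex is one simple way to keep the combined scores interpretable as posterior probabilities; the combiners actually evaluated in the article may differ in how the weights are fitted.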
Similar Articles
Ensemble Methods of Appropriate Capacity for Multi-Class Support Vector Machines
Roughly speaking, there is one single model of pattern recognition support vector machine (SVM), with variants of lower popularity. On the contrary, among the different multi-class SVMs (M-SVMs) published, none is clearly favoured. Although several comparative studies between M-SVMs and decomposition methods have been reported, no attention had been paid so far to the combination of those model...
Which Is the Best Multiclass SVM Method? An Empirical Study
Multiclass SVMs are usually implemented by combining several two-class SVMs. The one-versus-all method using winner-takes-all strategy and the one-versus-one method implemented by max-wins voting are popularly used for this purpose. In this paper we give empirical evidence to show that these methods are inferior to another one-versus-one method: one that uses Platt's posterior probabilities toge...
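As a generic illustration of the one-versus-one decomposition with max-wins voting mentioned in this abstract (not the code used in the study), the sketch below tallies, for every pair of classes, which class the corresponding binary SVM prefers and predicts the class with the most wins. The pairwise decision values are assumed to be the signed outputs of the binary classifiers; the names are hypothetical.

```python
# Hedged sketch of one-versus-one max-wins voting over pairwise SVM decisions.
# pairwise_decisions[(i, j)] is assumed to hold the signed decision values of
# the binary SVM separating class i (positive) from class j (negative).
import itertools
import numpy as np

def max_wins_voting(pairwise_decisions, n_classes, n_samples):
    votes = np.zeros((n_samples, n_classes), dtype=int)
    for i, j in itertools.combinations(range(n_classes), 2):
        d = np.asarray(pairwise_decisions[(i, j)])
        votes[:, i] += (d > 0)   # class i wins this duel
        votes[:, j] += (d <= 0)  # class j wins this duel
    return votes.argmax(axis=1)  # ties broken towards the lower class index

# Toy usage with random decision values for 3 classes and 5 samples.
rng = np.random.default_rng(1)
decisions = {(i, j): rng.standard_normal(5)
             for i, j in itertools.combinations(range(3), 2)}
labels = max_wins_voting(decisions, n_classes=3, n_samples=5)
```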
Combining protein secondary structure prediction models with ensemble methods of optimal complexity
Many sophisticated methods are currently available to perform protein secondary structure prediction. Since they are frequently based on different principles, and different knowledge sources, significant benefits can be expected from combining them. However, the choice of an appropriate combiner appears to be an issue in its own right. The first difficulty to overcome when combining prediction methods...
Classifier Combination Techniques and Support Vector Machine: An Application to Character Recognition
This paper presents a comparative evaluation of the performance of linear and non-linear techniques for combining the outputs of an ensemble of neural network classifiers, expressed as posterior probabilities, in the recognition of hand-written digits. The ensemble consists of Multi-layer Perceptron (MLP) modules, characterized by different topologies. Two different methods are used for the coe...
Multi-Task Multi-Sample Learning
In the exemplar SVM (E-SVM) approach of Malisiewicz et al., ICCV 2011, an ensemble of SVMs is learnt, with each SVM trained independently using only a single positive sample and all negative samples for the class. In this paper we develop a multi-sample learning (MSL) model which enables joint regularization of the E-SVMs without any additional cost over the original ensemble learning. The adva...